Comment: Lancaster Probabilities and Gibbs Sampling
Abstract
It is a pleasure to congratulate the authors on this excellent, original and pedagogical paper. I read a preliminary draft at the end of 2006 and mentioned to the authors at the time that their work should be set within the framework of Lancaster probabilities, a remote corner of probability theory now described in their Section 6.1. The reader is referred to Lancaster (1958, 1963, 1975) and to the syntheses by Koudou (1995, 1996) for more details. Given probabilities μ(dx) and ν(dy) on spaces X and Y, and given orthonormal bases p = (p_n(x)) and q = (q_n(y)) of L²(μ) and L²(ν), a probability σ on X × Y is said to be of the Lancaster type if either there exists a sequence ρ = (ρ_n) in ℓ² such that …
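The displayed condition is cut off at this point in the abstract. For orientation only, the representation that the definition is pointing to can be sketched in its usual form; this reconstruction (including the normalisation ρ_0 = 1, which presupposes p_0 = q_0 = 1) is the standard Lancaster-type expansion and not a quotation of the truncated text.

```latex
% Usual Lancaster-type representation (reconstruction, not a quotation of the
% truncated abstract): sigma has margins mu and nu, and its density with
% respect to mu(dx) nu(dy) expands along the two orthonormal bases.
\[
  \sigma(dx,\,dy)
    \;=\;
  \Bigl(\,\sum_{n=0}^{\infty} \rho_n\, p_n(x)\, q_n(y)\Bigr)\,\mu(dx)\,\nu(dy),
  \qquad \rho=(\rho_n)\in\ell^{2},\quad \rho_0 = 1 .
\]
```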
Related articles
Marginalizing Out Transition Probabilities for Several Subclasses of PFAs
A Bayesian approach that marginalizes out transition probabilities can be applied generally to various kinds of probabilistic finite state machine models. Based on such an approach, we implemented and compared three algorithms: variable-length gram, state merging for PDFAs, and collapsed Gibbs sampling for PFAs. Among these, collapsed Gibbs sampling for PFAs performed best on the da...
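To make the "marginalizing out" idea concrete, here is a minimal sketch of collapsed Gibbs sampling for a plain first-order Markov chain whose transition probabilities have been integrated out under symmetric Dirichlet priors; the model, the hyperparameter ALPHA, and the toy data are assumptions of this illustration, not the PFA implementation compared in that paper.

```python
# Minimal sketch (illustrative assumptions, not the paper's implementation):
# a first-order Markov chain over K symbols, symmetric Dirichlet(ALPHA) prior
# on each row of the transition matrix, and a few "missing" positions in the
# sequence that are resampled by collapsed Gibbs, i.e. with the transition
# probabilities integrated out so only transition counts are needed.
import numpy as np

rng = np.random.default_rng(0)
K, ALPHA = 3, 0.5          # alphabet size and Dirichlet hyperparameter (assumed)


def transition_counts(seq):
    """n[i, j] = number of transitions i -> j in the current sequence."""
    n = np.zeros((K, K))
    for a, b in zip(seq[:-1], seq[1:]):
        n[a, b] += 1
    return n


def collapsed_gibbs(seq, missing, n_sweeps=200):
    """Resample the entries listed in `missing` from their full conditionals."""
    seq = list(seq)
    for t in missing:                      # random initialisation of the unknowns
        seq[t] = rng.integers(K)
    T = len(seq)
    for _ in range(n_sweeps):
        for t in missing:
            n = transition_counts(seq)
            if t > 0:                      # drop the transitions touching position t
                n[seq[t - 1], seq[t]] -= 1
            if t < T - 1:
                n[seq[t], seq[t + 1]] -= 1
            probs = np.ones(K)
            for k in range(K):
                if t > 0:                  # Dirichlet-multinomial predictive, prev -> k
                    probs[k] *= n[seq[t - 1], k] + ALPHA
                if t < T - 1:              # predictive k -> next, with the standard
                    prev_is_k = t > 0 and seq[t - 1] == k   # correction for shared counts
                    extra = 1.0 if (prev_is_k and seq[t + 1] == k) else 0.0
                    probs[k] *= (n[k, seq[t + 1]] + ALPHA + extra) / (
                        n[k].sum() + K * ALPHA + (1.0 if prev_is_k else 0.0))
            seq[t] = rng.choice(K, p=probs / probs.sum())
    return seq


toy = [0, 1, 2, 0, 1, 2, 0, 1, 2, 0]       # toy sequence with a clear 0->1->2 cycle
print(collapsed_gibbs(toy, missing=[4, 7]))
```

Because the transition matrix is integrated out, each update only touches transition counts, which is what makes the sampler "collapsed".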
Evaluation of per-record identification risk by additive modeling of interaction for contingency table cell probabilities
We propose to fit a Lancaster-type additive model of interaction terms for cell probabilities of contingency tables to evaluate the conditional probability of population uniqueness of sample unique records in microdata sets. Moment estimation of the Lancaster-type additive model is straightforward and the proposed estimation procedure is intuitively appealing from the viewpoint of disclosure ri...
Adaptive Gibbs samplers
We consider various versions of adaptive Gibbs and Metropolis-within-Gibbs samplers, which update their selection probabilities (and perhaps also their proposal distributions) on the fly during a run, learning as they go in an attempt to optimise the algorithm. We present a cautionary example of how even a simple-seeming adaptive Gibbs sampler may fail to converge. We then present various pos...
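The mechanism being adapted is easy to state in code. The toy sketch below runs a random-scan Gibbs sampler on a correlated bivariate normal and drifts the coordinate-selection probabilities on the fly with a diminishing (1/n) step and a floor EPS; the target, the adaptation rule, and the constants are assumptions chosen for illustration, and, as the abstract warns, careless versions of this kind of adaptation can destroy convergence.

```python
# Toy adaptive random-scan Gibbs sampler (illustration only; the bivariate
# normal target, the jump-size-based adaptation rule, the 1/n step and the
# floor EPS on the selection probability are all assumptions made here).
import numpy as np

rng = np.random.default_rng(1)
RHO, EPS = 0.9, 0.1                 # target correlation, floor on selection prob

def adaptive_gibbs(n_iter=20000):
    x = np.zeros(2)                 # current state for the bivariate normal target
    p = np.array([0.5, 0.5])        # coordinate-selection probabilities
    sq_jump = np.ones(2)            # running mean squared jump per coordinate
    counts = np.zeros(2)
    chain = np.empty((n_iter, 2))
    for n in range(1, n_iter + 1):
        i = rng.choice(2, p=p)
        old = x[i]
        # exact full conditional of coordinate i given the other coordinate
        x[i] = RHO * x[1 - i] + np.sqrt(1 - RHO ** 2) * rng.standard_normal()
        counts[i] += 1
        sq_jump[i] += ((x[i] - old) ** 2 - sq_jump[i]) / counts[i]
        # diminishing adaptation: drift p toward jump-size-proportional weights,
        # keeping the selection probability of each coordinate inside [EPS, 1-EPS]
        target_p0 = sq_jump[0] / sq_jump.sum()
        p0 = p[0] + (target_p0 - p[0]) / n
        p0 = min(max(p0, EPS), 1 - EPS)
        p = np.array([p0, 1 - p0])
        chain[n - 1] = x
    return chain, p

chain, final_p = adaptive_gibbs()
print("final selection probabilities:", final_p)
print("sample correlation:", np.corrcoef(chain[1000:].T)[0, 1])
```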
Gibbs Sampling in Probabilistic Description Logics with Deterministic Dependencies
In many applications there is interest in representing both probabilistic and deterministic dependencies. This is especially the case in applications using Description Logics (DLs), where ontology engineering is usually based on strict knowledge, while there is also a need to represent uncertainty. We introduce a Markovian style of probabilistic reasoning in first-order logic known as Markov ...
Learning Conditional Probabilities from Incomplete Data: An Experimental Comparison
This paper compares three methods, namely the EM algorithm, Gibbs sampling, and Bound and Collapse (BC), to estimate conditional probabilities from incomplete databases in a controlled experiment. Results show a substantial equivalence of the estimates provided by the three methods and a dramatic gain in efficiency using BC. Reprinted from: Proceedings of Uncertainty 99: Seventh International Workshop on ...
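For the Gibbs-sampling entry of that comparison, a minimal data-augmentation sketch is given below for a toy problem: estimating P(Y = 1 | X = x) for binary X and Y when some Y values are missing. The Beta(1, 1) priors, the missing-at-random assumption, and the variable names are assumptions of this illustration; the EM and Bound and Collapse competitors are not reproduced here.

```python
# Toy Gibbs sampler (data augmentation) for a conditional probability from
# incomplete binary data; priors, names and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

def gibbs_conditional(x, y, n_iter=5000, burn=500):
    """x: 0/1 array; y: 0/1 array with np.nan marking missing entries."""
    y = y.astype(float).copy()
    miss = np.isnan(y)
    y[miss] = rng.integers(0, 2, miss.sum())          # initialise missing Y values
    theta = np.array([0.5, 0.5])                      # theta[v] = P(Y = 1 | X = v)
    draws = []
    for it in range(n_iter):
        # 1) sample theta[v] from its Beta full conditional given the completed data
        for v in (0, 1):
            ones = np.sum((x == v) & (y == 1))
            zeros = np.sum((x == v) & (y == 0))
            theta[v] = rng.beta(1 + ones, 1 + zeros)
        # 2) impute each missing Y from Bernoulli(theta[x_i])
        y[miss] = (rng.random(miss.sum()) < theta[x[miss]]).astype(float)
        if it >= burn:
            draws.append(theta.copy())
    return np.mean(draws, axis=0)                     # posterior means of P(Y=1|X=v)

# toy incomplete data set
x = np.array([0, 0, 0, 1, 1, 1, 1, 0, 1, 0])
y = np.array([0, 1, np.nan, 1, 1, np.nan, 0, 0, 1, np.nan])
print(gibbs_conditional(x, y))
```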